An introduction to dimension reduction in nonparametric kernel regression

Authors

  • Stéphane Girard
  • Jérôme Saracco
Abstract

Nonparametric regression is a powerful tool for estimating nonlinear relations between predictors and a response variable. However, when the number of predictors is high, nonparametric estimators may suffer from the curse of dimensionality. In this chapter, we show how a dimension reduction method (namely Sliced Inverse Regression) can be combined with nonparametric kernel regression to overcome this drawback. The methods are illustrated both on simulated datasets and on an astronomy dataset using the R software.
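The combination the abstract describes can be sketched in a few steps: estimate effective dimension-reduction (EDR) directions with Sliced Inverse Regression, project the predictors onto those directions, then run a kernel smoother on the low-dimensional index. The following is a minimal NumPy sketch of that pipeline, not the chapter's actual code (the chapter uses R); the simulated single-index model, slice count, and bandwidth are illustrative choices.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_components=1):
    """Estimate EDR directions via Sliced Inverse Regression."""
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    # Whiten the predictors: Z = (X - mu) @ cov^{-1/2}
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice the response and average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original scale
    _, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_components]

def nw_kernel_regression(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson estimator with a Gaussian kernel on a 1-D index."""
    d = (x_eval[:, None] - x_train[None, :]) / bandwidth
    K = np.exp(-0.5 * d ** 2)
    return (K * y_train).sum(axis=1) / K.sum(axis=1)

# Simulated single-index model: y depends on X only through one direction
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
beta_true = np.array([1.0, 2.0, 0.0, 0.0, 0.0]) / np.sqrt(5.0)
y = np.sin(X @ beta_true) + 0.1 * rng.normal(size=500)

# Reduce to a 1-D index, then smooth nonparametrically on that index
beta_hat = sir_directions(X, y)[:, 0]
beta_hat /= np.linalg.norm(beta_hat)
index = X @ beta_hat
y_hat = nw_kernel_regression(index, y, index, bandwidth=0.3)
```

Because the smoothing happens on a one-dimensional index rather than in the full five-dimensional predictor space, the kernel estimator sidesteps the curse of dimensionality as long as the single-index assumption holds.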


Related articles

Gradient-based kernel dimension reduction for regression

This paper proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive definite kernels or reproducing kernel Hilbert spaces. The purpose of the dimension reduction is to find directions in the explanatory variables that sufficiently explain the response: this is called sufficient dimension reduction. The proposed method is based on a...


PCA-Kernel Estimation

Many statistical estimation techniques for high-dimensional or functional data are based on a preliminary dimension reduction step, which consists in projecting the sample X1, ..., Xn onto the first D eigenvectors of the Principal Component Analysis (PCA) associated with the empirical projector Π̂D. Classical nonparametric inference methods such as kernel density estimation or kernel regressio...


Nonparametric Regression Estimation under Kernel Polynomial Model for Unstructured Data

The nonparametric estimation (NE) of the kernel polynomial regression (KPR) model is a powerful tool to visually depict the effect of covariates on the response variable when the data are unstructured and heterogeneous. In this paper we introduce a KPR model that is a mixture of nonparametric regression models with a bootstrap algorithm, which is considered in a heterogeneous and unstructured framewo...


Localized regression on principal manifolds

We consider nonparametric dimension reduction techniques for multivariate regression problems in which the variables constituting the predictor space are strongly nonlinearly related. Specifically, the predictor space is approximated via “local” principal manifolds, based on which a kernel regression is carried out.


One for all and all for one: Dimension reduction for regression checks

We develop a novel dimension-reduction approach to consistent checks of parametric regression models when many regressors are present. The principle is to replace the nonparametric alternative by a class of semiparametric alternatives, namely single-index models, that is rich enough to allow detection of any nonparametric alternative. We propose an omnibus test based on the kernel method that p...




Publication date: 2017